Marginalization in Composed Probabilistic Models
Author
Abstract
Composition of low-dimensional distributions, whose foundations were laid in the paper published in the Proceedings of UAI'97 (Jirousek 1997), has emerged as an alternative apparatus for describing multidimensional probabilistic models. In contrast to Graphical Markov Models, which define multidimensional distributions in a declarative way, this approach is procedural: ordering the low-dimensional distributions into a proper sequence fully determines the respective computational procedure. The study of different types of generating sequences is therefore one of the central problems in this field, and an important role is played by the special sequences called perfect, whose main characterization theorems are presented in this paper. The main result of the paper, however, is a solution to the problem of marginalization for general sequences. The main theorem describes how to obtain a generating sequence that defines the model corresponding to the marginal of the distribution defined by an arbitrary generating sequence. From this theorem the reader can see to what extent these computations are local; i.e., the sequence consists of marginal distributions whose computation must be made by summing over the values of the eliminated variable (the paper deals with a finite model).
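As a hedged illustration of the composition apparatus the abstract refers to, the sketch below implements the operator of composition for finite discrete distributions, (f ▷ g)(x) = f(x_K) · g(x_L) / g↓K∩L(x_K∩L), defined where the marginal in the denominator is positive. The variable names, binary domains, and toy tables are illustrative assumptions, not taken from the paper.

```python
from itertools import product

def marginal(dist, vars_, keep):
    """Sum a finite distribution table over all variables not in `keep`."""
    idx = [vars_.index(v) for v in keep]
    out = {}
    for assignment, p in dist.items():
        key = tuple(assignment[i] for i in idx)
        out[key] = out.get(key, 0.0) + p
    return out

def compose(f, f_vars, g, g_vars, domain=(0, 1)):
    """Composition f ▷ g over the union of the two variable sets.

    (f ▷ g)(x) = f(x_K) * g(x_L) / g↓K∩L(x_K∩L), where K, L are the
    variable sets of f and g; undefined points (zero marginal) are dropped.
    """
    shared = [v for v in f_vars if v in g_vars]
    g_shared = marginal(g, g_vars, shared)
    out_vars = f_vars + [v for v in g_vars if v not in f_vars]
    out = {}
    for assignment in product(domain, repeat=len(out_vars)):
        env = dict(zip(out_vars, assignment))
        fk = tuple(env[v] for v in f_vars)
        gl = tuple(env[v] for v in g_vars)
        sh = tuple(env[v] for v in shared)
        denom = g_shared.get(sh, 0.0)
        if denom > 0:
            out[assignment] = f[fk] * g[gl] / denom
    return out_vars, out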
Similar Resources
Reasoning and Decisions in Probabilistic Graphical Models - A Unified Framework
OF THE DISSERTATION Reasoning and Decisions in Probabilistic Graphical Models – A Unified Framework By Qiang Liu Doctor of Philosophy in Computer Science University of California, Irvine, 2014 Prof. Alexander Ihler, Chair Probabilistic graphical models such as Markov random fields, Bayesian networks and decision networks (a.k.a. influence diagrams) provide powerful frameworks for representing a...
Inference by Reparameterization in Neural Population Codes
Behavioral experiments on humans and animals suggest that the brain performs probabilistic inference to interpret its environment. Here we present a new generalpurpose, biologically-plausible neural implementation of approximate inference. The neural network represents uncertainty using Probabilistic Population Codes (PPCs), which are distributed neural representations that naturally encode pro...
Marginal inferential models
Inferential models (IMs) provide a general framework for prior-free, frequencycalibrated, posterior probabilistic inference. The fundamental idea is the use of unobservable auxiliary variables to describe the underlying uncertainty about the parameter of interest. When nuisance parameters are present, a marginalization step can reduce the dimension of the auxiliary variable, which in turn leads...
10-708: Probabilistic Graphical Models, Spring 2014, Lecture 13: Variational Inference: Loopy Belief Propagation
The problem of probabilistic inference concerns answering queries about conditional and marginal probabilities in graphical models. Consider two disjoint subsets E and F of the nodes in a graphical model G. A query regarding the marginal distribution p(x_F) can be calculated by the marginalization operation Σ_{G\F} p(x). A query regarding the conditional distribution p(x_F | x_E) can be calculated by p(x_F |...
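The two queries described in this snippet can be sketched for a small finite joint table; the variable names (x1, x2, x3) and the toy probabilities below are illustrative assumptions, not taken from the lecture notes.

```python
import numpy as np

# Illustrative joint table p(x1, x2, x3); axes correspond to x1, x2, x3.
p = np.array([[[0.10, 0.05], [0.15, 0.10]],
              [[0.05, 0.20], [0.25, 0.10]]])

# Marginal query p(x_F): sum out every variable not in F (here F = {x1}).
p_x1 = p.sum(axis=(1, 2))

# Conditional query p(x_F | x_E): marginalize to F ∪ E, then normalize
# over the evidence (here F = {x1}, E = {x3}).
p_x1_x3 = p.sum(axis=1)        # joint over (x1, x3)
p_x3 = p_x1_x3.sum(axis=0)     # marginal of the evidence x3
cond = p_x1_x3 / p_x3          # p(x1 | x3); each column sums to 1
```

The normalization step is what distinguishes the conditional from the marginal query: both start from the same summed-out table.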
Using Probabilistic-Risky Programming Models in Identifying Optimized Pattern of Cultivation under Risk Conditions (Case Study: Shoshtar Region)
Using the Telser and Kataoka models of probabilistic-risky mathematical programming, the present research determines the optimized pattern of cultivating the agricultural products of the Shoshtar region under risky conditions. In order to account for risk in these models, the period from the 1996-1997 through the 2004-2005 agricultural years was taken into account. Results from the Telser and Kataoka mod...